
# Pretrained model

**Sparklerl 7B Stage1** (sparkle-reasoning)
A transformers model published on the Hub; specific functions and detailed information have not yet been provided.
Tags: Large Language Model, Transformers · Downloads: 1,551 · Likes: 1
**TILDE** (ielab)
A model based on the BERT architecture, used mainly for text retrieval and language modeling tasks.
Tags: Large Language Model, Transformers · Downloads: 134 · Likes: 3
**Bert Laos Base Uncased** (GKLMIP)
A pretrained model for Lao word segmentation, designed to handle Lao text segmentation tasks.
Tags: Sequence Labeling, Transformers · Downloads: 31 · Likes: 1
**Mlm Spanish Roberta Base** (MMG)
A Spanish pretrained language model based on the RoBERTa architecture, focused on masked language modeling tasks.
Tags: Large Language Model, Transformers, Spanish · Downloads: 21 · Likes: 2